
feat: add PSA stack #424

Merged
alanshaw merged 4 commits into main from feat/add-psa-stack on Oct 1, 2024

Conversation

alanshaw (Member)

Adds a temporary stack for the old Pinning Service API (PSA) data.

The stack creates 2 functions:

  • hash: given a root CID, finds a CAR file in one of the configured buckets and calculates its hash.
  • download: given a root CID, finds a CAR file in one of the configured buckets and returns a signed URL allowing temporary access to the data.

This is for the PSA migration tool in console.

  1. Users list pinned root CIDs.
  2. Users get hash of CAR file containing pinned data by calling hash.
  3. Users blob/add CAR file.
  4. (Maybe) users need to upload the CAR themselves, in which case they call download to get a signed URL, download the data, and upload it to Storacha.
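The hash step above amounts to digesting the CAR file's bytes. A minimal Node.js sketch, assuming sha2-256 over an in-memory buffer (the real Lambda presumably streams the object from S3; `hashCarBytes` is a hypothetical helper, not code from this PR):

```javascript
// Sketch: compute the sha2-256 digest (hex) of a CAR file's bytes.
// In production the function would stream the S3 object through the
// hash to bound memory; a buffer keeps the example self-contained.
import { createHash } from 'node:crypto'

export function hashCarBytes (bytes) {
  return createHash('sha256').update(bytes).digest('hex')
}
```

The migration tool can compare this digest against the multihash of a `blob/add`-ed blob to confirm the data already uploaded matches the pinned CAR.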

hannahhoward (Member)

hahhah PSA

hannahhoward (Member) left a comment
LGTM, though my personal knowledge of old web3.storage is not sufficient to tell whether this is the right place to look for CAR files + hashes.

psa/config.js (Outdated)

    export const buckets = [
      {
        name: process.env.S3_DOTSTORAGE_0_BUCKET_NAME,
        region: process.env.S3_DOTSTORAGE_0_BUCKET_REGION,
hannahhoward (Member)

Is this just the names we have, 0 & 1?

alanshaw (Member, Author)

Yeah, for historical reasons we had 2 buckets that received some pinning service data. They are in different regions.
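Given the snippet above and this reply, the full config presumably looks something like the following. The property names for bucket 0 come from the diff; the bucket-1 env var names are an assumption by analogy:

```javascript
// Sketch of psa/config.js: two historical dot-storage buckets in
// different regions, both searched for pinned CAR data.
// S3_DOTSTORAGE_1_* names are assumed to mirror the bucket-0 pattern.
export const buckets = [
  {
    name: process.env.S3_DOTSTORAGE_0_BUCKET_NAME,
    region: process.env.S3_DOTSTORAGE_0_BUCKET_REGION
  },
  {
    name: process.env.S3_DOTSTORAGE_1_BUCKET_NAME,
    region: process.env.S3_DOTSTORAGE_1_BUCKET_REGION
  }
]
```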

    }

    try {
      const url = await getDownloadURL(buckets, root)
hannahhoward (Member)
buckets doesn't need an import?

alanshaw (Member, Author)
It certainly does!
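The fix acknowledged here is presumably a one-line import in the module calling getDownloadURL; the relative path is an assumption about the repo layout:

```javascript
// Hypothetical fix: pull the bucket config into the handler module.
import { buckets } from './config.js'
```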

    downloadFunction.attachPermissions(['s3:GetObject'])

    stack.addOutputs({
      hashFunctionURL: hashFunction.url,
hannahhoward (Member)
Do these need a domain name?

alanshaw (Member, Author)
Yes, good shout - otherwise they will change when we deploy.

alanshaw (Member, Author)
Going to have to set up a redirect for this manually, since we want to allow execution of >30s (hashing big files). AWS API Gateway has a 30s limit, which I think is now configurable, but I don't think our SST version supports it yet.

seed-deploy bot temporarily deployed to pr424 on September 30, 2024 13:22 (Inactive)

seed-deploy bot commented Sep 30, 2024

View stack outputs

alanshaw merged commit 120908a into main on Oct 1, 2024 (3 checks passed)
alanshaw deleted the feat/add-psa-stack branch on October 1, 2024 09:21
2 participants